Twitter and Tear Gas by Zeynep Tufekci

Author: Zeynep Tufekci
Language: English
Format: EPUB
Publisher: Yale University Press
Published: 2017-09-03


The proprietary, opaque, and personalized nature of algorithmic control on the web also makes it difficult even to understand what drives visibility on platforms, what is seen by how many people, and how and why they see it. Broadcast television can be monitored by anyone to see what is being covered and what is not, but the individualized algorithmic feed or search results are visible only to their individual users. This creates a double challenge: if the content a social movement is trying to disseminate is not being shared widely, the creators do not know whether the algorithm is burying it, or whether their message is simply not resonating.

If the nightly television news does not cover a protest, the lack of coverage is evident for all to see and even to contest. In Turkey, during the Gezi Park protests, lack of coverage on broadcast television networks led to protests: people marched to the doors of the television stations and demanded that the news show the then-widespread protests. However, there is no transparency in algorithmic filtering: how is one to know whether Facebook is showing Ferguson news to everyone else but him or her, whether there is just no interest in the topic, or whether it is the algorithmic feedback cycle that is depressing the updates in favor of a more algorithm-friendly topic, like the ALS charity campaign?

Algorithmic filtering can produce complex effects. It can result in more polarization and at the same time deepen the filter bubble.44 The bias toward “Like” on Facebook promotes the echo-chamber effect, making it more likely that one sees posts one already agrees with. Of course, this builds upon the pre-existing human tendency to gravitate toward topics and positions one already agrees with—confirmation bias—which is well demonstrated in social science research. Facebook’s own studies show that the algorithm contributes to this bias by making the feed somewhat more tilted toward one’s existing views, reinforcing the echo chamber.45

Another type of bias is “comment” bias, which can promote visibility for the occasional quarrels that have garnered many comments. But how widespread are these problems, and what are their effects? It is hard to study any of this directly because the data are owned by Facebook—or, in the case of search, Google. These are giant corporations that control and make money from the user experience, and yet the impact of that experience is not accessible to study by independent researchers.
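
The mechanics behind these biases can be made concrete with a small sketch. The following Python snippet is a hypothetical illustration, not Facebook's actual ranking code: the `Post` fields, the weights, and the sample engagement numbers are all invented assumptions, chosen only to show how ranking purely on Likes and comments rewards quarrels and agreeable content while burying low-engagement posts.

```python
# Hypothetical sketch of an engagement-weighted feed ranker.
# All field names, weights, and numbers are illustrative assumptions;
# they do not reflect any platform's real algorithm.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    likes: int = 0
    comments: int = 0


def engagement_score(post: Post, like_weight: float = 1.0,
                     comment_weight: float = 3.0) -> float:
    """Score a post by raw engagement counts.

    Weighting comments more heavily than Likes (a common assumption
    about engagement ranking) means a heated quarrel with many replies
    outranks a widely seen but quietly read post: the "comment" bias.
    """
    return like_weight * post.likes + comment_weight * post.comments


def rank_feed(posts: list[Post], top_n: int = 3) -> list[Post]:
    """Return the top_n posts by engagement, hiding the rest.

    Because users tend to "Like" what they already agree with, feeding
    those Likes back into the ranking amplifies agreeable content:
    the echo-chamber loop described in the text.
    """
    return sorted(posts, key=engagement_score, reverse=True)[:top_n]


if __name__ == "__main__":
    feed = [
        Post("ALS ice-bucket video", likes=900, comments=40),
        Post("Heated comment-section quarrel", likes=30, comments=400),
        Post("Ferguson protest update", likes=120, comments=15),
        Post("Cute pet photo", likes=500, comments=10),
    ]
    for post in rank_feed(feed):
        print(f"{engagement_score(post):7.1f}  {post.text}")
    # The protest update scores lowest and is "buried" -- not because
    # nobody cares, but because the ranking rewards Likes and comments,
    # and the poster has no way to tell which of the two happened.
```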

Social movement activists are keenly attuned to this issue. I often hear of potential tweaks to the algorithms of major platforms from activists who are constantly trying to reverse-engineer them and understand how to get past them. They are among the first people to notice slight changes. Groups like Upworthy have emerged to produce political content designed to be friendly to Facebook's algorithm and to go viral. However, this is not a neutral game. Just as attracting mass-media attention through stunts came with political costs, playing to the algorithm comes with political costs as well. Upworthy, for example, has ended …


